Symplectic Spectrum Gaussian Processes: Learning Hamiltonians from Noisy and Sparse Data

Neural Information Processing Systems

Hamiltonian mechanics is a well-established theory for modeling the time evolution of systems with a conserved quantity (the Hamiltonian), such as the total energy of the system. Recent works have parameterized the Hamiltonian with machine learning models (e.g., neural networks), allowing Hamiltonian dynamics to be learned from state trajectories without explicit mathematical modeling. However, the performance of existing models is limited because only noisy and sparse trajectories can be observed in practice. This paper proposes a probabilistic model that can learn the dynamics of conservative or dissipative systems from noisy and sparse data. We introduce a Gaussian process that incorporates the symplectic geometric structure of Hamiltonian systems, which is used as a prior distribution for estimating Hamiltonian systems with additive dissipation. We then present its spectral representation, Symplectic Spectrum Gaussian Processes (SSGPs), for which we newly derive random Fourier features with symplectic structures. This allows us to construct an efficient variational inference algorithm for training the models while simulating the dynamics via ordinary differential equation solvers. Experiments on several physical systems show that SSGP offers excellent performance in predicting dynamics that follow the energy conservation or dissipation law from noisy and sparse data.
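The symplectic structure the abstract refers to can be made concrete with a small sketch: any Hamiltonian H(q, p) induces the vector field (dq/dt, dp/dt) = (∂H/∂p, −∂H/∂q), and models like SSGP learn H rather than the field directly. The sketch below uses a hand-written quadratic energy (an illustrative assumption, not the learned GP) together with a symplectic Euler integrator:

```python
import numpy as np

# A minimal sketch, NOT the SSGP model: H is a hand-written quadratic
# energy, and derivatives are taken by central finite differences.

def hamiltonian(q, p, m=1.0, k=1.0):
    """Toy separable energy: kinetic term plus quadratic potential."""
    return p**2 / (2.0 * m) + 0.5 * k * q**2

def symplectic_field(q, p, eps=1e-6):
    """Hamiltonian vector field (dq/dt, dp/dt) = (dH/dp, -dH/dq)."""
    dH_dq = (hamiltonian(q + eps, p) - hamiltonian(q - eps, p)) / (2 * eps)
    dH_dp = (hamiltonian(q, p + eps) - hamiltonian(q, p - eps)) / (2 * eps)
    return dH_dp, -dH_dq

def step(q, p, dt=0.01):
    """Symplectic Euler: update p with the force at q, then q with the new p."""
    _, dp = symplectic_field(q, p)
    p = p + dt * dp
    dq, _ = symplectic_field(q, p)
    return q + dt * dq, p

q, p = 1.0, 0.0
E0 = hamiltonian(q, p)
for _ in range(1000):
    q, p = step(q, p)
# Energy stays close to E0 because the integrator preserves the
# symplectic structure (unlike plain explicit Euler).
print(abs(hamiltonian(q, p) - E0))
```

This energy-preservation property is exactly what a symplectic prior bakes into the model class, instead of leaving it for the data to teach.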






Supplementary Material: Symplectic Spectrum Gaussian Processes: Learning Hamiltonians from Noisy and Sparse Data

A Derivation of the spectral representation

Neural Information Processing Systems

The ELBO is derived from Jensen's inequality as follows:

$$\log p(Y) \geq \iiint q(X, f, w) \log \frac{p(Y, X, f, w)}{q(X, f, w)} \, dw \, df \, dX \quad (31)$$

The right-hand side is then expanded using the factorized variational posterior, whose $p(f \mid w)\, q(w)$ factors appear in the subsequent step of the derivation. The inference procedure of SSGP is shown in Algorithm 1. In the experiments, we set the integration time window to 1. At each iteration, the parameters are updated by maximizing the ELBO (13) evaluated using D. In this appendix, we also describe the baseline models for the experiments in Section 6: D-SymODEN can also be applied to dissipative systems, and SympGPR can estimate conservative vector fields from derivative observations by exploiting Hamiltonian mechanics; we used finite differences for training.
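The spectral representation underlying SSGP builds on random Fourier features (Rahimi and Recht), which approximate a stationary kernel by sampling frequencies from its spectral density. Below is a minimal sketch of plain (non-symplectic) random Fourier features for an RBF kernel; the function name and parameters are illustrative, not SSGP's symplectic variant:

```python
import numpy as np

# Random Fourier features: phi(x) = sqrt(2/D) * cos(W^T x + b), with the
# columns of W drawn from the kernel's spectral density (Gaussian for RBF).

rng = np.random.default_rng(0)

def rff_features(X, n_features=2000, lengthscale=1.0):
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

X = rng.normal(size=(5, 2))
Phi = rff_features(X)
K_approx = Phi @ Phi.T

# Exact RBF kernel exp(-||x - x'||^2 / 2) for comparison.
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
print(np.max(np.abs(K_approx - K_exact)))  # Monte Carlo error, shrinks as 1/sqrt(D)
```

The finite feature map turns the GP into a Bayesian linear model over the feature weights, which is what makes the variational inference in Algorithm 1 tractable alongside an ODE solver.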



Conformal Symplectic and Relativistic Optimization
Guilherme França

Neural Information Processing Systems

Arguably, the two most popular accelerated or momentum-based optimization methods are Nesterov's accelerated gradient and Polyak's heavy ball, both corresponding to different discretizations of a particular second-order differential equation with a friction term. Such connections with continuous-time dynamical systems have been instrumental in demystifying acceleration phenomena in optimization. Here we study structure-preserving discretizations for a certain class of dissipative (conformal) Hamiltonian systems, allowing us to analyze the symplectic structure of both Nesterov and heavy ball, besides providing several new insights into these methods. Moreover, we propose a new algorithm based on a dissipative relativistic system that normalizes the momentum and may result in more stable/faster optimization. Importantly, such a method generalizes both Nesterov and heavy ball, each being recovered as a distinct limiting case, and has potential advantages at no additional cost.
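The second-order-ODE view of heavy ball mentioned above can be sketched in a few lines: the update below is a standard discretization of m x'' + γ x' = −∇f(x), with the friction absorbed into the momentum coefficient. The toy quadratic objective and the hyperparameter names (`lr`, `beta`) are illustrative assumptions:

```python
# Heavy ball as a discretization of the damped second-order ODE
# m x'' + gamma x' = -grad f(x). Objective and hyperparameters are
# illustrative, not taken from the paper.

def grad_f(x):
    return 2.0 * x          # gradient of f(x) = x^2

def heavy_ball(x0, lr=0.1, beta=0.9, steps=500):
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad_f(x)   # friction (beta) plus force term
        x = x + v                       # position update
    return x

x_star = heavy_ball(5.0)
print(abs(x_star))  # converges toward the minimizer x = 0
```

Setting `beta = 0` recovers plain gradient descent, which illustrates how momentum methods sit in a family indexed by the friction coefficient of the underlying dissipative system.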